    Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study

    BACKGROUND: Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (qPCR), quantitative reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. RESULTS: The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a Bayesian mixed-model approach. The latter accounts for the statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. CONCLUSIONS: Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to substantially improve the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
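
    As a rough illustration of the contrast drawn above, the sketch below implements only the traditional calibration step: fit a standard curve on dilutions of known density, then invert it to obtain a point estimate for an unknown sample. The linear signal model and every number are invented for demonstration; the paper's Bayesian mixed model would instead pool replicate assays and carry uncertainty through to the density estimate.

```python
# Illustrative sketch, not the paper's code: traditional calibration of a
# quantitative molecular assay. A standard curve is fitted to dilutions of
# known density, then inverted to estimate an unknown sample's density.
# The linear signal model and all numbers are invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)

# Calibration standards: known log10 densities and a simulated assay
# signal (e.g. time-to-positivity), assumed linear in log10 density.
log10_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
signal = 60.0 - 8.0 * log10_density + rng.normal(0.0, 1.5, size=5)

# Fit the standard curve: signal = intercept + slope * log10(density).
slope, intercept = np.polyfit(log10_density, signal, deg=1)

def estimate_log10_density(measured_signal: float) -> float:
    """Invert the fitted standard curve for an unknown sample."""
    return (measured_signal - intercept) / slope

# A single point estimate, with no honest uncertainty attached: the gap
# the paper's Bayesian mixed model is designed to close.
print(estimate_log10_density(35.0))
```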

    Failure of a Novel, Rapid Antigen and Antibody Combination Test to Detect Antigen-Positive HIV Infection in African Adults with Early HIV Infection

    BACKGROUND: Acute HIV infection (prior to antibody seroconversion) represents a high-risk window for HIV transmission. Development of a test to detect acute infection at the point of care is urgent. METHODS: Volunteers enrolled in a prospective study of HIV incidence in four African cities, Kigali in Rwanda and Ndola, Kitwe and Lusaka in Zambia, were tested regularly for HIV by rapid antibody test and p24 antigen ELISA. Five subgroups of samples were also tested by the Determine Ag/Ab Combo test: 1) antigen positive, antibody negative (acute infection); 2) antigen positive, antibody positive; 3) antigen negative, antibody positive; 4) antigen negative, antibody negative; and 5) antigen false positive, antibody negative (HIV uninfected). A sixth group comprised serial dilutions of a p24 antigen-positive control sample. Combo test results were reported as antigen positive, antibody positive, or both. RESULTS: Of 34 group 1 samples with VL between 5 × 10^5 and >1.5 × 10^7 copies/mL (median 3.5 × 10^6), 1 (2.9%) was detected by the Combo antigen component; 7 (20.6%) others were positive by the Combo antibody component. No group 2 samples were antigen positive by the Combo test (0/18). Sensitivity of the Combo antigen test was therefore 1.9% (1/52, 95% CI 0.0, 9.9). One false-positive Combo antibody result (1/30, 3.3%) was observed in group 4. No false-positive Combo antigen results were observed. The Combo antigen test was positive in group 6 at a concentration of 80 pg/mL, faintly positive at 40 and 20 pg/mL, and negative thereafter. The p24 ELISA antigen test remained positive at 5 pg/mL. CONCLUSIONS: Although the antibody component of the Combo test detected antibodies to HIV earlier than the comparison antibody tests used, fewer than 2% of the cases of antigen-positive HIV infection were detected by the Combo antigen component. The development of a rapid point-of-care test to diagnose acute HIV infection remains an urgent goal.
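
    For context on how an exact binomial interval like the reported 1.9% (1/52, 95% CI 0.0, 9.9) is commonly computed, here is a minimal Clopper-Pearson sketch. The abstract does not state which interval method the authors used, so this is an assumption, and the result matches only approximately.

```python
# Illustrative sketch, not the paper's code: exact binomial
# (Clopper-Pearson) 95% CI for a sensitivity of 1 detection in 52 samples.
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact binomial confidence interval for k successes in n trials."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

lo, hi = clopper_pearson(1, 52)
print(f"sensitivity {1/52:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
# prints roughly 1.9%, (0.0%, 10.3%), close to the abstract's (0.0, 9.9)
```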

    IP-10 Levels as an Accurate Screening Tool to Detect Acute HIV Infection in Resource-Limited Settings.

    Acute HIV infection (AHI) is the period prior to seroconversion, characterized by high viral replication, hyper-transmission potential and, commonly, non-specific febrile illness. AHI detection requires HIV-RNA viral load (VL) determination, access to which is very limited in low-income countries owing to high costs and implementation constraints. We sought to identify a biomarker that could enable AHI diagnosis in resource-limited settings, and to evaluate the feasibility of its implementation. HIV-seronegative adults presenting at the Manhiça District Hospital, Mozambique, with reported fever were tested for VL. Plasma levels of 49 inflammatory biomarkers from AHI (n = 61) and non-HIV-infected outpatients (n = 65) were determined by Luminex and ELISA. IP-10 demonstrated the best predictive power for AHI detection (AUC = 0.88 [95% CI 0.80-0.96]). A cut-off value of IP-10 ≥ 161.6 pg/mL provided a sensitivity of 95.5% (95% CI 85.5-99.5) and a specificity of 76.5% (95% CI 62.5-87.2). The implementation of an IP-10 screening test could avert from 21 to 84 new infections and save the health system from US$176,609 to US$533,467 per 1,000 tested patients. We conclude that IP-10 is an accurate biomarker for screening febrile HIV-seronegative individuals for subsequent AHI diagnosis with VL. Such an algorithm is a cost-effective strategy to prevent disease progression and a substantial number of further HIV infections.
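
    A minimal sketch of the kind of ROC analysis behind an AUC and cut-off like those reported, using synthetic IP-10 levels rather than the study's data; the simulated distributions, the sample sizes reused from the abstract, and the 95% sensitivity target are illustrative assumptions.

```python
# Illustrative sketch on synthetic data, not the study's: ROC analysis to
# derive an AUC and a screening cut-off for a biomarker such as IP-10.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# Hypothetical log-normal IP-10 levels, higher in acute HIV infection (AHI).
ip10_ahi = rng.lognormal(mean=6.0, sigma=0.8, size=61)  # AHI outpatients
ip10_neg = rng.lognormal(mean=4.5, sigma=0.8, size=65)  # HIV-uninfected
levels = np.concatenate([ip10_ahi, ip10_neg])
truth = np.concatenate([np.ones(61), np.zeros(65)])

print("AUC:", roc_auc_score(truth, levels))

# Choose the highest cut-off that still achieves 95% sensitivity, then
# report the specificity that comes with it.
fpr, tpr, thresholds = roc_curve(truth, levels)
best = np.argmax(tpr >= 0.95)  # first index reaching the sensitivity target
print(f"cut-off {thresholds[best]:.1f} pg/mL: "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}")
```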

    Monitoring Virologic Responses to Antiretroviral Therapy in HIV-Infected Adults in Kenya: Evaluation of a Low-Cost Viral Load Assay

    A key advantage of monitoring HIV viral load (VL) in persons receiving antiretroviral therapy (ART) is the ability to detect virologic failure before clinical deterioration or resistance occurs. Detection of virologic failure helps clarify the need for enhanced adherence counseling or a change to second-line therapy. Low-cost, locally performable alternatives to expensive VL assays are needed where resources are limited. We monitored the response to 48 weeks of ART in 100 treatment-naïve Kenyan adults using a low-cost VL measurement, the Cavidi reverse transcriptase (RT) assay, and two gold-standard assays, the Roche RNA PCR and Bayer Versant HIV-1 RNA (bDNA) assays. In Bland-Altman plots, the mean difference in viral loads between the three assays was small (<0.5 log10 copies/mL). However, the limits of agreement between the methods exceeded the biologically relevant change of 0.5 log10 copies/mL. Therefore, the RT assay cannot be used interchangeably with the other assays to monitor individual patients. The RT assay was 100% sensitive in detecting viral loads of ≥400 copies/mL compared to the gold-standard assays. After 24 weeks of treatment, viral load measured by the RT assay was undetectable in 95% of 65 patients with undetectable RNA PCR VL (<400 copies/mL), 90% of 67 patients with undetectable bDNA VL, and 96% of 57 patients with undetectable VL in both the RNA PCR and bDNA assays. The negative predictive value of the RT assay was 100% compared to either assay; the positive predictive value was 86% compared to RNA PCR and 70% compared to bDNA. The RT assay compared well with the gold-standard assays. Our study highlights the importance of not interchanging viral load assays when monitoring an individual patient. Furthermore, the RT assay may be limited by low positive predictive values when used in populations with a low prevalence of virologic failure.
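
    A minimal Bland-Altman sketch on synthetic data, showing how the bias between two assays can be small while the 95% limits of agreement still exceed 0.5 log10 copies/mL; all simulated values are assumptions, not the study's measurements.

```python
# Illustrative sketch on synthetic data, not the study's: Bland-Altman
# agreement between two viral-load assays on the log10 scale.
import numpy as np

rng = np.random.default_rng(2)
true_vl = rng.uniform(2.5, 6.0, size=100)           # log10 copies/mL
assay_a = true_vl + rng.normal(0.0, 0.3, size=100)  # e.g. the RT assay
assay_b = true_vl + rng.normal(0.1, 0.3, size=100)  # e.g. RNA PCR

diff = assay_a - assay_b
bias = diff.mean()                    # mean difference between assays
half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement

# The bias can sit well under 0.5 log10 while the limits of agreement
# span more than that, ruling out per-patient interchangeability.
print(f"bias {bias:+.2f}; limits of agreement "
      f"({bias - half_width:+.2f}, {bias + half_width:+.2f}) log10 copies/mL")
```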

    Perinatal acquisition of drug-resistant HIV-1 infection: mechanisms and long-term outcome

    BACKGROUND: Primary HIV-1 infection in newborns occurring under antiretroviral prophylaxis carries a high risk of drug-resistance acquisition. We examined the frequency and the mechanisms of resistance acquisition at the time of infection in newborns. PATIENTS AND METHODS: We studied HIV-1-infected infants born between 1 January 1997 and 31 December 2004 and enrolled in the ANRS-EPF cohort. HIV-1 RNA and HIV-1 DNA samples obtained perinatally from the newborn and mother were subjected to population-based and clonal analyses of drug resistance. If resistance was detected, serial samples were obtained from the child for further resistance testing. RESULTS: Ninety-two HIV-1-infected infants were born during the study period. Samples were obtained from 32 mother-child pairs and from another 28 newborns. Drug resistance was detected in 12 newborns (20%): resistance to nucleoside reverse transcriptase inhibitors was seen in 10 cases, to non-nucleoside reverse transcriptase inhibitors in two cases, and to protease inhibitors in one case. For 9 children, the detection of the same resistance mutations in the mothers' samples (6 of 10 available) and in newborn lymphocytes (6/8) suggests that the newborn was initially infected by a drug-resistant strain. Resistant variants were either transmitted from mother to child or selected during subsequent exposure under suboptimal perinatal prophylaxis. Follow-up of the infants showed that the resistance pattern remained stable over time, regardless of antiretroviral therapy, suggesting early cellular archiving of resistant viruses. The absence of resistance in the mothers of the other three children (3/10) and in neonatal lymphocytes (2/8) suggests that these newborns were infected by a wild-type strain, without long-term persistence of resistance once suboptimal prophylaxis was stopped. CONCLUSION: This study confirms the importance of early resistance genotyping of HIV-1-infected newborns. In most cases (75%), drug resistance was archived in the cellular reservoir and persisted during infancy, with or without antiretroviral treatment. This finding stresses the need for effective antiretroviral treatment of pregnant women.

    Soil Microbial Responses to Elevated CO2 and O3 in a Nitrogen-Aggrading Agroecosystem

    Climate change factors such as elevated atmospheric carbon dioxide (CO2) and ozone (O3) can exert significant impacts on soil microbes and the ecosystem-level processes they mediate. However, the underlying mechanisms by which soil microbes respond to these environmental changes remain poorly understood. The prevailing hypothesis, which states that CO2- or O3-induced changes in carbon (C) availability dominate microbial responses, is primarily based on results from nitrogen (N)-limited forests and grasslands. How soil microbes respond to elevated CO2 and O3 in N-rich or N-aggrading systems remains largely unexplored, which severely hinders our ability to predict long-term soil C dynamics in agroecosystems. Using a long-term field study conducted in a no-till wheat-soybean rotation system with open-top chambers, we showed that elevated CO2, but not O3, had a potent influence on soil microbes. Elevated CO2 (1.5× ambient) significantly increased, while O3 (1.4× ambient) reduced, aboveground (and presumably belowground) plant residue C and N inputs to soil. However, only elevated CO2 significantly affected soil microbial biomass, activities (namely heterotrophic respiration) and community composition. The enhancement of microbial biomass and activities by elevated CO2 largely occurred in the third and fourth years of the experiment and coincided with increased soil N availability, likely due to CO2 stimulation of symbiotic N2 fixation in soybean. Fungal biomass and the fungi:bacteria ratio decreased under both ambient and elevated CO2 by the third year, also coinciding with increased soil N availability, but both remained significantly higher under elevated than ambient CO2. These results suggest that more attention should be directed towards assessing the impact of N availability on microbial activities and decomposition when projecting the soil organic C balance of N-rich systems under future CO2 scenarios.

    Characteristics and management of HIV-1-infected pregnant women enrolled in a randomised trial: differences between Europe and the USA

    BACKGROUND: Rates of mother-to-child transmission of HIV-1 (MTCT) have historically been lower in European than in American cohort studies, possibly due to differences in population characteristics. The Pediatric AIDS Clinical Trials Group (PACTG) Protocol 316 trial evaluated the effectiveness of adding intrapartum/neonatal nevirapine to reduce MTCT in women already receiving antiretroviral prophylaxis. The participation of large numbers of pregnant HIV-infected women from the US and Western Europe in the same clinical trial provided the opportunity to identify and explore differences in their characteristics and in the use of non-study interventions to reduce MTCT. METHODS: In this secondary analysis, 1350 women were categorized according to enrollment in centres in the USA (n = 978) or in Europe (n = 372). Factors associated with receipt of highly active antiretroviral therapy and with elective caesarean delivery were identified with logistic regression. RESULTS: In Europe, enrolled women were more likely to be white, and those of black race were mainly born in Sub-Saharan Africa. Women in the US were younger and more likely to have had previous pregnancies and miscarriages and a history of sexually transmitted infections. More than 90% of women did not report symptoms of their HIV infection; however, more women from the US had symptoms (8%) than women from Europe (4%). Women in the US were less likely to have HIV RNA levels <400 copies/ml at delivery than women enrolling in Europe, and were more likely to receive highly active antiretroviral therapy and to start therapy earlier in pregnancy. The elective caesarean delivery rate in Europe was 61%, significantly higher than in the US (22%). Overall, 1.48% of infants were infected, and there was no significant difference in the rate of transmission between Europe and the US despite the different approaches to treatment and delivery. CONCLUSION: These findings confirm that there are important historical differences between the HIV-infected pregnant populations in Western Europe and the USA, both in the characteristics of the women and in their obstetric and therapeutic management. Although highly active antiretroviral therapy now predominates in pregnancy in both settings, population differences are likely to remain. TRIAL REGISTRATION: NCT00000869
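
    A minimal sketch of a logistic regression of the kind described in the Methods, on synthetic data; the variable names, effect sizes, and model specification are hypothetical, not the trial's.

```python
# Illustrative sketch on synthetic data, not the trial's: logistic
# regression for factors associated with receipt of HAART. Variable
# names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1350
df = pd.DataFrame({
    "us_site": rng.integers(0, 2, n),          # 1 = enrolled in the USA
    "age": rng.normal(28.0, 6.0, n),
    "prior_pregnancy": rng.integers(0, 2, n),
})
# Simulate the outcome with a higher probability of HAART at US sites.
linpred = -1.0 + 1.2 * df.us_site + 0.02 * (df.age - 28.0)
df["haart"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

model = smf.logit("haart ~ us_site + age + prior_pregnancy", data=df).fit(disp=0)
print(np.exp(model.params))  # odds ratios for each factor
```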

    Perinatal HIV transmission and the cost-effectiveness of screening at 14 weeks gestation, at the onset of labour and the rapid testing of infants

    BACKGROUND: Preventing HIV transmission is a worldwide public health issue. Vertical transmission of HIV from mother to child can be prevented with diagnosis and treatment, but screening incurs cost. The U.S. Virgin Islands follows the mainland policy on antenatal screening for HIV even though HIV prevalence is higher and rates of antenatal care are lower, leading to many cases of vertically transmitted HIV. A better policy is required for the U.S. Virgin Islands. METHODS: The objective of this research was to estimate the cost-effectiveness of relevant HIV screening strategies for the antenatal population in the U.S. Virgin Islands. An economic model was used to evaluate, from a societal perspective, the incremental costs and incremental health benefits of nine different combinations of perinatal HIV screening strategies compared with existing practice. Three opportunities for screening were considered, in isolation and in combination: by 14 weeks gestation, at the onset of labor, or of the infant after birth. The main outcome measure was the cost per life year gained (LYG). RESULTS: All strategies would produce benefits and save costs. Universal screening by 14 weeks gestation combined with screening of the infant after birth is the recommended strategy, with cost savings of $1,122,787 and health benefits of 310 LYG. Limitations include the limited research on variations in acceptance of screening based on specimen type, race and economic status. The benefits of screening after 14 weeks gestation but before the onset of labor were also not addressed. CONCLUSION: This study highlights the benefits of offering screening at different opportunities, including repeat screening, and raises the question of generalizing these results to other countries with similar characteristics.
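
    A minimal sketch of the incremental comparison underlying a cost per LYG figure; only the cost savings and LYG quoted in the abstract are taken from the study, and the dominance rule shown is the standard convention, not the authors' model.

```python
# Illustrative arithmetic: incremental comparison of a screening strategy
# against existing practice. Only the quoted cost savings and LYG come
# from the abstract; the dominance rule is the standard convention.

def evaluate(delta_cost: float, delta_lyg: float) -> str:
    """Summarize incremental cost and life years gained vs. the comparator."""
    if delta_cost < 0 and delta_lyg > 0:
        return f"dominant: saves ${-delta_cost:,.0f} and gains {delta_lyg} LYG"
    return f"ICER = ${delta_cost / delta_lyg:,.0f} per LYG"

# Recommended strategy: universal screening by 14 weeks gestation plus
# rapid testing of the infant after birth.
print(evaluate(delta_cost=-1_122_787, delta_lyg=310))
```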

    Early infant HIV-1 diagnosis programs in resource-limited settings: opportunities for improved outcomes and more cost-effective interventions

    Early infant diagnosis (EID) of HIV-1 infection confers substantial benefits on HIV-infected and HIV-uninfected infants, on their families, and on programs providing prevention of mother-to-child transmission (PMTCT) services, but it has been challenging to implement in resource-limited settings. To correctly inform parents/caregivers of infant infection status and link HIV-infected infants to care and treatment, a 'cascade' of events must successfully occur. A frequently cited barrier to expansion of EID programs is the cost of the required laboratory assays, but substantial implementation barriers, as well as personnel and infrastructure requirements, exist at each step in the cascade. In this update, we review challenges to uptake at each step in the EID cascade, highlighting that even with the highest reported levels of uptake, nearly half of HIV-infected infants may not complete the cascade successfully. We then synthesize the available literature on the costs and cost-effectiveness of EID programs, identify areas for future research, and place these findings within the context of the benefits and challenges of EID implementation in resource-limited settings.
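
    A minimal sketch of why cascade completion falls so low: multiplying per-step retention rates shows how modest losses compound. The step names and rates below are hypothetical, not figures from the review.

```python
# Illustrative sketch with hypothetical step names and rates: cumulative
# completion of the early infant diagnosis (EID) cascade is the product
# of per-step retention, so modest losses compound quickly.
from math import prod

cascade = {
    "sample collected": 0.95,
    "result returned to clinic": 0.92,
    "caregiver informed of result": 0.90,
    "infant linked to care": 0.85,
    "treatment initiated": 0.88,
}
print(f"infants completing every step: {prod(cascade.values()):.0%}")  # ~59%
```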